Columns

Why Pharma Needs to Go Multivariate

Aren’t pharmaceutical operations too complex for the industry to continue to rely on univariate thinking?


By: Emil W. Ciurczak

Independent Pharmaceuticals Professional

I was just watching a health report on television. The topic? The potential impact of drinking more than two diet sodas a day on women over 50. The report was based on studying thousands of women in that age group and tracking them over a period of several decades.

There was an apparent increase in the incidence of heart disease, ovarian cancer and other illnesses in the group.  In some cases, the incidence was more than 50% higher than it was in the control group that didn’t drink diet soda. 

However, conclusions were drawn without mentioning any other potentially contributing factors, such as weight, smoking, diet, economic status, and access to health care. This approach, isolating a single cause and an (apparent) effect, has been used for many, many years in many, many studies. Why do we still use it in so many pharma operations?

This question reminded me of a scene from a Batman movie. The Penguin, running for mayor, notes that, while he is always in the company of law-enforcement personnel, Batman is always seen with criminals. The joke, of course, was that he was a criminal being arrested while Batman was arresting criminals. Surprisingly, simple correlation coefficients not far removed from this logic are used at all levels of the pharmaceutical industry. [Keep in mind, as my statistics professor pointed out, that a better-than-0.9 correlation exists between the rise and fall of the Mississippi River and the birth rate in Paris, France. How much can we trust simple one-to-one correlations?]
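My professor's Mississippi-and-Paris example is easy to reproduce: any two series that merely share a trend over time will show a high Pearson correlation, causation or not. A minimal sketch with synthetic data (the numbers are invented, not real hydrology or demography):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 1960)

# Two completely unrelated series that merely trend upward over the decades,
# standing in for "river level" and "birth rate" -- synthetic, not real data.
river_level = 10.0 + 0.05 * (years - 1900) + rng.normal(0, 0.15, years.size)
birth_rate = 20.0 + 0.10 * (years - 1900) + rng.normal(0, 0.30, years.size)

# A simple one-to-one correlation is well above 0.9 despite zero causation.
r = np.corrcoef(river_level, birth_rate)[0, 1]
print(f"correlation: r = {r:.3f}")
```

The shared trend, not any causal link, is doing all the work; that is exactly the trap a single correlation coefficient sets.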

Why is this relevant to a pharmaceutical manufacturer? Well, it becomes extremely important as we move to actual process control via on-line and in-line instruments (PAT). An HPLC analysis, for example, is a “univariate” analysis, meaning you only get one piece of data: in this case, the chemical content of a single dosage form. And since you need to destroy the tablet to run it, you no longer know its dissolution profile or other key information.

It also goes without saying that HPLC is hardly capable of giving real-time data for process understanding or control. [If you include sampling, labeling, transport, logging the sample, preparing the mobile phase, conditioning a column, preparing standards, preparing the sample, filtering the sample, injecting the sample, assessing the results, and reporting the results, a single assay can take a full day.]

To follow a process in real time, a fast, non-destructive analysis technique is needed. For solid dosage forms, this almost always means spectroscopy. The chief technologies that are used are near-infrared (NIR), Raman, and to a lesser extent, light-induced fluorescence (LIF). All these techniques are used on the complex mixture that is the dosage form.

These methods “see” everything in the mix: API(s), excipients, particle size(s), and tablet density. As a consequence, the spectrum contains far more than the chemical information needed for an API assay. The API signal is buried, and is often only a small contribution to the tablet spectrum.
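To see why a "buried" API signal is still recoverable when you model everything at once, here is a toy simulation. Everything in it is invented (the Gaussian "pure spectra," the baseline offset standing in for scatter and density, the noise level): a multivariate fit against the pure-component spectra recovers the API fraction, while a single-wavelength calibration at the API band is swamped by everything else the probe sees. Classical least squares is used here as a simple stand-in for the PLS models NIR practitioners typically build.

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(1100, 2500, 200)  # a pretend NIR wavelength axis, in nm

def band(center, width, height):
    """A Gaussian absorption band (illustration only)."""
    return height * np.exp(-(((wl - center) / width) ** 2))

# Hypothetical pure-component spectra: the excipient dominates the mixture.
api_spec = band(1650, 60, 1.0) + band(2100, 80, 0.6)
excip_spec = band(1450, 120, 2.5) + band(1930, 150, 3.0)

n = 50
api = rng.uniform(0.08, 0.12, n)     # API mass fraction: a small contributor
excip = 1.0 - api
baseline = rng.uniform(0.0, 0.5, n)  # per-tablet scatter/density offset

# Each measured spectrum "sees" everything: API, excipient, and baseline.
X = np.outer(api, api_spec) + np.outer(excip, excip_spec) + baseline[:, None]
X += rng.normal(0, 0.005, X.shape)

# Multivariate (classical least squares): fit each spectrum as a mixture of
# the pure spectra plus an offset, then read off the API coefficient.
S = np.column_stack([api_spec, excip_spec, np.ones_like(wl)])
coefs, *_ = np.linalg.lstsq(S, X.T, rcond=None)
rmse_multi = np.sqrt(np.mean((coefs[0] - api) ** 2))

# Univariate: one wavelength at the API band maximum, simple linear fit.
x1 = X[:, np.argmax(api_spec)]
slope, intercept = np.polyfit(x1, api, 1)
rmse_uni = np.sqrt(np.mean((slope * x1 + intercept - api) ** 2))

print(f"RMSE multivariate: {rmse_multi:.4f}, univariate: {rmse_uni:.4f}")
```

The single-wavelength model fails not because the instrument is bad, but because the baseline and excipient variation at that wavelength dwarf the API variation; the multivariate model separates those contributions.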

More Work Than an HPLC Assay, But Worth It (Millisecond Measurements)
Thus, if you only wish to follow the API content as the dosage forms are produced, you also need to account for the other parameters. This is often far more work than an HPLC assay; you need to measure hardness, weight, and thickness, among other parameters.

The first good news is that you can now glean the amount of API in milliseconds, not hours. This means you don’t have to make a few million tablets and then find that your values are high or low, or that Content Uniformity is terrible. Adjustments can’t be made after the product is finished…except to throw it away (or, if you’re extremely lucky, rework it).

Of all the difficulties encountered in developing a process analysis, the hardest to overcome is convincing some of the people in your own Quality Assurance (QA) department, not those at the FDA or EMA. While the Agencies have accepted the concepts of ICH Q8, 9, 10, and 11 (they were, after all, part of ICH), the majority of QA departments (staffed with the people voted “most resistant to change” in their high school yearbooks) tend to drag their feet. This is because they have grown up in the “HPLC for everything” era, with all references to method validation planted firmly in the 1980s.

The most egregious concept is Range. The ICH Q2(R1) guidance on validation specifies a range of 80 to 120% of label claim for assays and 70 to 130% for CU. These are fine for HPLC methods, since they don’t actually require dosage forms in this range, but merely that the LC method shows “linearity” (I’ll get to that fiasco later) across it. The whole calibration is done with a series of synthetically prepared solutions injected into the HPLC system. Unfortunately, to calibrate an in-process analysis, actual artificial tablets need to be pressed.

This is where the power of a multivariate, rather than a univariate, approach comes to the fore. Since most chemists and, apparently, all QA people are merely looking at the amount of API, a univariate calibration seems like a good idea.

News Flash: Real Materials in a Real Setting Don’t Follow ICH Rules
Unfortunately, real materials in a real setting don’t follow ICH rules. To illustrate why “common sense” is usually wrong in the multivariate world (much as quantum mechanics doesn’t seem logical in the macro world), let me propose a thought problem (in the spirit of one of my heroes, whose initials were A.E.).

Suppose you make a blend that is absolutely well-blended and does not separate during processing. The only way a tablet could contain 70, 80, 90, 120, or 130% of label claim would be to…wait for it…make tablets at 70 to 130% of the target weight. How likely is that to happen in the real world? I imagine that even the most inexpensive tablet-weighing device could detect a 130% error, no? So, to make these synthetic tablets for calibration (merely to please QA), we need to vary the ratio of API(s) to excipients while keeping the tablet weights the same (really?).

Calibration Samples Won’t Represent the Product Accurately
One problem with that approach is that the APIs and excipients seldom have the same density; therefore, the same volume of granulation admitted to the punches (the tablet press is set to deliver the same volume, over and over) may well not have the same mass. And, in the rare case where the masses are equal, there is little chance that the new mixture will compress in the same manner.

Why does this matter? Remember, you are now looking at all the physical and chemical properties of the tablets. Compression differences immediately affect the density, hardness, and particle sizes of the ingredients.

Throw in weight differences and we do not, in actuality, have calibration samples that represent the product. If we use a different tablet press, the differences are even greater [1,2].

What those studies found was that accuracy and precision were not improved by merely extending the range. In almost every case investigated, the only statistic improved by extending the range was the correlation coefficient. That alone should show that R or R2 are “feel-good” numbers that may or may not have relevance to the validation of the method. A further discussion is included in the references.
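That behavior of the correlation coefficient is easy to reproduce in simulation (synthetic numbers, not the data from the referenced studies): give a method the same fixed analytical error and simply widen the concentration range. The RMSE, which actually reflects accuracy, barely moves; R2 climbs anyway, because the spread of the true values grows while the error does not.

```python
import numpy as np

rng = np.random.default_rng(2)

def validate(lo, hi, n=200, noise_sd=2.0):
    """Simulate validating a method with a FIXED analytical error
    over a given concentration range (% of label claim)."""
    true = rng.uniform(lo, hi, n)
    pred = true + rng.normal(0, noise_sd, n)  # same error in both cases
    rmse = np.sqrt(np.mean((pred - true) ** 2))
    r2 = np.corrcoef(true, pred)[0, 1] ** 2
    return r2, rmse

r2_narrow, rmse_narrow = validate(90, 110)  # realistic production spread
r2_wide, rmse_wide = validate(70, 130)      # ICH-style extended range

print(f"narrow range: R2 = {r2_narrow:.3f}, RMSE = {rmse_narrow:.2f}")
print(f"wide range:   R2 = {r2_wide:.3f}, RMSE = {rmse_wide:.2f}")
```

R2 rewards range, not accuracy, which is why extending the calibration range is a "feel-good" exercise rather than a real improvement.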

My point is that traditional analytical methodology is not suited for process control. A lot of pharmaceutical professionals are fearful of things that are different for a number of reasons.

While just about every Agency and process engineer has been convinced of the value of real-time analysis and control, the (traditionally) most conservative group still needs to be dragged (kicking and screaming, apparently) into the 21st century. Now, repeat after me: “Change is good.  Change is good. Change is good…”

References

  1. E.W. Ciurczak, G.E. Ritchie, R. Roller, H. Mark, C. Tso, and S. MacDonald. “Validation of a NIR Transmission Spectroscopic Procedure, Part A: Validation Protocols,” J. Pharm. Biomed. Anal. 28 (2), 251-260 (2002).
  2. E.W. Ciurczak, G.E. Ritchie, R. Roller, H. Mark, C. Tso, and S. MacDonald. “Validation of a NIR Transmission Spectroscopic Procedure, Part B: Application to Alternate Content Uniformity and Release Assay Methods for Pharmaceutical Dosage Forms,” J. Pharm. Biomed. Anal. 29 (1-2), 159-171 (2002).

Emil W. Ciurczak
DoraMaxx Consulting

Emil W. Ciurczak has worked in the pharmaceutical industry since 1970 for companies that include Ciba-Geigy, Sandoz, Berlex, Merck, and Purdue Pharma, where he specialized in performing method development on most types of analytical equipment. In 1983, he introduced NIR spectroscopy to pharmaceutical applications, and is generally credited as one of the first to use process analytical technologies (PAT) in drug manufacturing and development.
